Model Selection for Regularized Least-Squares Algorithm in Learning Theory

Authors

  • Ernesto De Vito
  • Andrea Caponnetto
  • Lorenzo Rosasco
Abstract

We investigate the problem of model selection for learning algorithms that depend on a continuous parameter. We propose a model selection procedure based on a worst-case analysis and a data-independent choice of the parameter. For the regularized least-squares algorithm we bound the generalization error of the solution by a quantity depending on a few known constants, and we show that the corresponding model selection procedure reduces to solving a bias-variance problem. Under a suitable smoothness condition on the regression function, we estimate the optimal parameter as a function of the number of data, and we prove that this choice ensures consistency of the algorithm.
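The regularized least-squares algorithm the abstract refers to is kernel ridge regression: given n samples, it solves (K + nλI)c = y for the coefficient vector c, and the model selection problem is choosing λ as a function of n before seeing the data. The sketch below is only illustrative and is not the paper's construction: the Gaussian kernel, its width, and the exponent in the a priori rule λ = n^(-1/2) are all assumptions made here for demonstration, whereas the paper derives the optimal exponent from smoothness conditions on the regression function.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def rls_fit(X, y, lam):
    # Regularized least squares: solve (K + n*lam*I) c = y.
    n = len(X)
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, c, X_new):
    # f(x) = sum_i c_i k(x_i, x)
    return gaussian_kernel(X_new, X_train) @ c

# Data-independent parameter rule lam = n^(-b); b = 1/2 is purely
# illustrative, not the rate proved optimal in the paper.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)
lam = n ** (-0.5)
c = rls_fit(X, y, lam)
pred = rls_predict(X, c, np.array([[0.0], [0.5]]))
```

The point of the data-independent rule is that λ is fixed by n alone, so no hold-out set or cross-validation is needed; the paper's contribution is showing which rate for λ(n) balances the bias and variance terms in the worst-case error bound.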


Similar Articles

Risk Bounds and Model Selection for Regularized Least Squares

Motivation: A central problem in learning theory is a quantitative assessment of the inference property of a learning algorithm that ensures consistency. A number of seminal works show that the essential feature of an algorithm should be its capacity to control the complexity of the solution; this is usually realized by introducing a parametric family of learning algorithms in which the parameters...


Predictable Sequences and Competing with Strategies

First, we study online learning with an extended notion of regret, which is defined with respect to a set of strategies. We develop tools for analyzing the minimax rates and deriving efficient learning algorithms in this scenario. While the standard methods for minimizing the usual notion of regret fail, through our analysis we demonstrate the existence of regret-minimization methods that compe...



GURLS: a least squares library for supervised learning

We present GURLS, a least squares, modular, easy-to-extend software library for efficient supervised learning. GURLS is targeted to machine learning practitioners, as well as non-specialists. It offers a number of state-of-the-art training strategies for medium and large-scale learning, and routines for efficient model selection. The library is particularly well suited for multi-output problems (m...


A Unified Approach to Model Selection and Sparse Recovery Using Regularized Least Squares

Model selection and sparse recovery are two important problems for which many regularization methods have been proposed. We study the properties of regularization methods in both problems under the unified framework of regularized least squares with concave penalties. For model selection, we establish conditions under which a regularized least squares estimator enjoys a nonasymptotic property, ...



Journal:
  • Foundations of Computational Mathematics

Volume 5, Issue –

Pages –

Publication date: 2005